

Search for: All records

Creators/Authors contains: "Wood, A."


  1. Abstract

    Coastal marshes mitigate allochthonous nitrogen (N) inputs to adjacent marine habitat; however, their extent is declining rapidly. As a result, marsh restoration and construction have become major foci of wetland management. Constructed marshes can quickly reach plant biomass similar to that of natural marshes, but biogeochemical functions like N removal and retention can take decades to reach functional equivalency, often due to lags in the development of organic matter (OM) pools in newly constructed marshes. We compared denitrification and dissimilatory nitrate reduction to ammonium (DNRA) rates in a 32-year-old constructed marsh and an adjacent reference marsh in the Northern Gulf of Mexico. Marsh sediments packed into 3 mm “thin discs” were subjected to three OM quality treatments (no OM addition, labile OM, or recalcitrant OM) and two N treatments (ambient nitrate or elevated nitrate) during a 13-day incubation. We found that OM addition, rather than marsh type or nitrate treatment, was the most important driver of nitrate reduction, increasing both denitrification and DNRA and promoting DNRA over denitrification in both marshes. Fungal and bacterial biomass were higher in the natural marsh across treatments, but recalcitrant OM increased fungal biomass in the constructed marsh, suggesting OM limitation of fungal growth. We found that constructed marshes are capable of denitrification and DNRA rates similar to those of natural marshes after 30 years, and that labile OM addition promotes N retention in both natural and constructed marshes.

    Graphical Abstract

    Conceptual figure highlighting the findings of this experiment. Under the control treatment with no C addition (bottom panel), constructed and natural marshes have similar rates of both DNRA and denitrification. The natural marsh has higher fungal and bacterial biomass, while fungal biomass is not detectable in the constructed marsh. Under labile OM additions (upper left panel), rates of both DNRA and denitrification increase and DNRA becomes favored over denitrification in both marshes. Recalcitrant OM additions (upper right panel) increase denitrification but do not affect DNRA or % denitrification. The addition of recalcitrant OM also increases the detectability of fungal biomass in the constructed marsh.

     
  2. Abstract

    A mass-conserving method to downscale precipitation from global climate models (GCMs) using sub-grid-scale topography and modeled 700-mb wind direction is presented. Runoff is simulated using a stand-alone hydrological model driven by this and several other downscaling methods, and compared to runoff simulated using historical observations over the western contiguous United States. Results suggest that mitigating grid-scale biases is more critical than downscaling for some regions with large wet biases (e.g., the Great Basin and Upper Colorado). In other regions (e.g., the Pacific Northwest), the new method produces more realistic sub-grid-scale variability in runoff compared to unadjusted GCM output and a simpler downscaling method. The presented method also brings the runoff centroid timing closer to that simulated with observations for all subregions examined.

     
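    As a rough illustration of the mass-conserving idea described in the abstract above, the sketch below redistributes one coarse GCM precipitation value over hypothetical sub-grid topographic weights. The actual method also uses modeled 700-mb wind direction, which is omitted here, so this is only a minimal sketch under assumed inputs.

        import numpy as np

        def downscale_mass_conserving(p_gcm, weights):
            # p_gcm   : precipitation for one coarse GCM cell (e.g., mm/day)
            # weights : non-negative sub-grid weights, e.g., derived from topography
            # Returns a sub-grid field whose area-mean equals p_gcm (mass-conserving).
            w = np.asarray(weights, dtype=float)
            w = w / w.mean()              # normalize so the mean weight is exactly 1
            return p_gcm * w

        # Hypothetical example: a 4x4 sub-grid inside one coarse cell
        rng = np.random.default_rng(0)
        topo_weights = 1.0 + rng.random((4, 4))    # stand-in for topographic enhancement
        p_fine = downscale_mass_conserving(5.0, topo_weights)
        assert np.isclose(p_fine.mean(), 5.0)       # area-mean (mass) is conserved
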
  3. Abstract

    Despite the proliferation of computer‐based research on hydrology and water resources, such research is typically poorly reproducible: data and computer code are often incompletely available, and workflow processes are rarely documented. This leads to a lack of transparency and efficiency, because existing code can neither be quality controlled nor reused. Given the commonalities between existing process‐based hydrologic models in terms of their required input data and preprocessing steps, open sharing of code can lead to large efficiency gains for the modeling community. Here, we present a model configuration workflow that provides full reproducibility of the resulting model instantiations in a way that separates the model‐agnostic preprocessing of specific data sets from the model‐specific requirements that models impose on their input files. We use this workflow to create large‐domain (global and continental) and local configurations of the Structure for Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model connected to the mizuRoute routing model. These examples show how a relatively complex model setup over a large domain can be organized in a reproducible and structured way that has the potential to accelerate advances in hydrologic modeling for the community as a whole. We provide a tentative blueprint of how community modeling initiatives can be built on top of workflows such as this. We term our workflow the “Community Workflows to Advance Reproducibility in Hydrologic Modeling” (CWARHM; pronounced “swarm”).

     
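    To illustrate the separation described in the abstract above between model-agnostic preprocessing and model-specific input preparation, the sketch below splits the two steps into separate functions; the function names, directory layout, and file names are hypothetical and are not the actual CWARHM interfaces.

        from pathlib import Path

        # Hypothetical directory layout; the real CWARHM structure may differ.
        RAW = Path("data/raw")            # downloaded forcing and attribute data
        SHARED = Path("data/shared")      # model-agnostic, reusable intermediates
        SUMMA_IN = Path("data/summa")     # model-specific SUMMA input files

        def preprocess_forcing(raw_dir: Path, out_dir: Path) -> Path:
            """Model-agnostic step: subset/regrid forcing once, reusable by any model."""
            out_dir.mkdir(parents=True, exist_ok=True)
            out_file = out_dir / "forcing_subset.nc"
            # ... subset the raw forcing data to the study domain and write out_file ...
            return out_file

        def write_summa_inputs(shared_forcing: Path, out_dir: Path) -> None:
            """Model-specific step: impose SUMMA's naming and file-format requirements."""
            out_dir.mkdir(parents=True, exist_ok=True)
            # ... rename variables and write SUMMA's configuration/attribute files ...

        if __name__ == "__main__":
            forcing = preprocess_forcing(RAW, SHARED)
            write_summa_inputs(forcing, SUMMA_IN)   # a mizuRoute step would follow similarly
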
  4. Abstract

    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
    Free, publicly-accessible full text available July 1, 2024
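    The uncertainties quoted above have the form of binomial efficiency errors; the short calculation below shows how numbers of that form arise, using invented event counts that are not the ProtoDUNE-SP values.

        import math

        def efficiency_with_error(n_pass, n_total):
            # Efficiency and its binomial (normal-approximation) uncertainty.
            eff = n_pass / n_total
            err = math.sqrt(eff * (1.0 - eff) / n_total)
            return eff, err

        # Hypothetical counts, chosen only to show the form of the result
        eff, err = efficiency_with_error(2941, 3416)
        print(f"efficiency = {100 * eff:.1f} +/- {100 * err:.1f} %")   # ~86.1 +/- 0.6 %
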
  5. Free, publicly-accessible full text available June 1, 2024
  6. Free, publicly-accessible full text available May 1, 2024
  7. Abstract

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
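    The abstract above mentions CUDA kernels generated from Python with Numba; the toy kernel below shows that general pattern for a per-pixel computation. It is a minimal sketch, not part of the DUNE simulator, and the array names and sizes are assumptions.

        import math
        import numpy as np
        from numba import cuda

        @cuda.jit
        def add_gaussian_pulse(current, t0, amp, sigma, dt):
            # Toy kernel: add a Gaussian current pulse to each pixel's waveform.
            pix = cuda.grid(1)                      # one GPU thread per pixel
            if pix < current.shape[0]:
                for i in range(current.shape[1]):
                    t = i * dt
                    current[pix, i] += amp[pix] * math.exp(-0.5 * ((t - t0[pix]) / sigma) ** 2)

        # Hypothetical sizes: 1000 pixels, 200 time samples per waveform
        n_pix, n_t = 1000, 200
        rng = np.random.default_rng(0)
        current = cuda.to_device(np.zeros((n_pix, n_t), dtype=np.float32))
        t0 = cuda.to_device(rng.uniform(0.0, 2.0, n_pix).astype(np.float32))
        amp = cuda.to_device(rng.uniform(0.5, 1.5, n_pix).astype(np.float32))

        threads = 128
        blocks = (n_pix + threads - 1) // threads
        add_gaussian_pulse[blocks, threads](current, t0, amp, np.float32(0.05), np.float32(0.01))
        waveforms = current.copy_to_host()          # (n_pix, n_t) array of simulated currents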